Buddhi-128k-Chat is a general-purpose chat model with a 128K context window. It is fine-tuned from Mistral 7B Instruct and uses the YaRN (Yet another RoPE extensioN) method to extend the usable context length to 128,000 tokens.
Tags: Large Language Model · Transformers · English
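Below is a minimal usage sketch with the Transformers library. The repo id `aiplanet/buddhi-128k-chat-7b`, the dtype, and the generation settings are assumptions for illustration; check the model page for the exact identifier and recommended prompt format.

```python
# Minimal sketch: load Buddhi-128k-Chat and run a long-context chat prompt.
# The repo id below is an assumption; replace it with the actual Hub identifier.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aiplanet/buddhi-128k-chat-7b"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit the 7B weights on a single GPU
    device_map="auto",
)

# Mistral-style chat formatting via the tokenizer's chat template.
messages = [
    {"role": "user", "content": "Summarize the key points of the document below.\n\n<very long document>"}
]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

# Thanks to the YaRN-extended 128K window, prompts far longer than the base
# Mistral 7B context can be passed in; generation itself works as usual.
output = model.generate(inputs, max_new_tokens=512, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Note that serving the full 128K window requires substantial GPU memory for the KV cache, so shorter contexts or an inference server with paged attention may be preferable in practice.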